Pruning with Minimum Description Length

Author

  • Jon Sporring
Abstract

The number of parameters in a model and the model's ability to generalize to the underlying data-generating machinery are tightly coupled. Neural networks usually consist of a large number of parameters, and pruning (the process of setting individual parameters to zero) has been used to reduce a net's complexity in order to increase its generalization ability. Another, less obvious, approach is to use the Minimum Description Length (MDL) principle to increase generalization. MDL is the only model selection criterion giving a uniform treatment of a) the complexity of the model and b) how well the model fits a specific data set. This article investigates pruning based on MDL, and it is shown that the derived algorithm results in a scheme identical to the well-known Optimal Brain Damage pruning. Furthermore, an example is given on a well-known benchmark data set, yielding fine results.
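The abstract states that the pruning rule derived from MDL (which scores a model by the combined cost of describing the model and describing the data given the model) coincides with Optimal Brain Damage (OBD). OBD ranks each weight w_i by the saliency s_i = 0.5 * H_ii * w_i^2, a second-order estimate of the increase in training loss incurred by setting w_i to zero, and removes the least salient weights. The following is a minimal NumPy sketch of that saliency rule only; the function name obd_prune, the frac parameter, and the assumption of a precomputed diagonal Hessian hessian_diag are illustrative choices, not taken from the paper.

import numpy as np

def obd_prune(weights, hessian_diag, frac=0.1):
    # OBD saliency of weight w_i: s_i = 0.5 * H_ii * w_i**2,
    # the second-order estimate of the loss increase when w_i
    # is zeroed (cross terms and first-order terms neglected).
    w = weights.ravel()
    h = hessian_diag.ravel()
    saliency = 0.5 * h * w ** 2
    k = int(frac * w.size)              # number of weights to remove
    cut = np.argsort(saliency)[:k]      # indices of least salient weights
    mask = np.ones(w.size, dtype=bool)
    mask[cut] = False                   # zero out the low-saliency weights
    return (w * mask).reshape(weights.shape), mask.reshape(weights.shape)

# Illustrative usage with random stand-ins for trained weights
# and the diagonal Hessian of the training loss:
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
h = np.abs(rng.normal(size=(4, 4)))
pruned, mask = obd_prune(w, h, frac=0.25)

In practice the diagonal Hessian would be estimated from second derivatives of the training loss on the training set, and pruning would typically alternate with retraining of the remaining weights.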


Related articles

Pruning Fuzzy ARTMAP using the Minimum Description Length Principle in Learning from Clinical Databases

Fuzzy ARTMAP is one of the family of neural network architectures based on ART (Adaptive Resonance Theory) in which supervised learning can be carried out. However, it usually tends to create more categories than are actually needed. This often causes the so-called overfitting problem, namely that the performance of the network on the test set is not monotonically increasing with the additional tr...


Unsupervised Transduction Grammar Induction via Minimum Description Length

We present a minimalist, unsupervised learning model that induces relatively clean phrasal inversion transduction grammars by employing the minimum description length principle to drive search over a space defined by two opposing extreme types of ITGs. In comparison to most current SMT approaches, the model learns very parsimonious phrase translation lexicons that provide an obvious basis for...


An Evolutionary Multi-objective Neural Network Optimizer with Bias-Based Pruning Heuristic

Neural network design aims for high classification accuracy and low network architecture complexity. It is also known that simultaneous optimization of both model accuracy and complexity improves generalization while avoiding overfitting on data. We describe a neural network training procedure that uses multi-objective optimization to evolve networks which are optimal both with respect to class...


Cross-Validation and Minimum Generation Error based Decision Tree Pruning for HMM-based Speech Synthesis

This paper presents a decision tree pruning method for the model clustering of HMM-based parametric speech synthesis by cross-validation (CV) under the minimum generation error (MGE) criterion. Decision-tree-based model clustering is an important component in the training process of an HMM-based speech synthesis system. Conventionally, the maximum likelihood (ML) criterion is employed to choose...


Adjusting for Multiple Testing in Decision Tree Pruning

Overfitting is a widely observed pathology of induction algorithms. Overfitted models contain unnecessary structure that reflects nothing more than chance variations in the particular data sample used to construct the model. Portions of these models are literally wrong, and can mislead users. Overfitted models require more storage space and take longer to execute than their correctly-sized counterpa...




Publication date: 1995